
Resilient Constrained Learning

Neural Information Processing Systems

When machine learning solutions are deployed, they must satisfy multiple requirements beyond accuracy, such as fairness, robustness, or safety.





Neural Information Processing Systems

In light of the vast quantities of personal and sensitive information involved, traditional methods of ensuring privacy are encountering significant challenges. We propose two innovative algorithms, DP-GLMtron and DP-TAGLMtron, that outperform the conventional DPSGD.
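For context on the DPSGD baseline mentioned above: its core step clips each per-example gradient to a fixed norm, averages, and adds calibrated Gaussian noise. The sketch below illustrates that standard mechanism only; it is not the paper's DP-GLMtron or DP-TAGLMtron, and the function name and parameters are ours.

```python
import numpy as np

def dpsgd_step(per_example_grads, clip_norm=1.0, noise_multiplier=1.0,
               lr=0.1, rng=None):
    """One step of the conventional DP-SGD baseline: clip each
    per-example gradient to `clip_norm`, average, then add Gaussian
    noise with std noise_multiplier * clip_norm / batch_size."""
    rng = rng or np.random.default_rng(0)
    clipped = []
    for g in per_example_grads:
        norm = np.linalg.norm(g)
        # scale down (never up) so each example's gradient norm <= clip_norm
        clipped.append(g * min(1.0, clip_norm / max(norm, 1e-12)))
    mean_grad = np.mean(clipped, axis=0)
    noise = rng.normal(0.0, noise_multiplier * clip_norm / len(per_example_grads),
                       size=mean_grad.shape)
    return -lr * (mean_grad + noise)  # parameter update
```

Because each example contributes at most `clip_norm` to the sum, the Gaussian noise level directly controls the privacy guarantee of the step.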





RegBN: Batch Normalization of Multimodal Data with Regularization

Neural Information Processing Systems

However, the integration of heterogeneous multimodal data poses a significant challenge, as confounding effects and dependencies among such heterogeneous data sources introduce unwanted variability and bias, leading to suboptimal performance of multimodal models. Therefore, it becomes crucial to normalize the low- or high-level features extracted from data modalities before their fusion takes place.
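To make the normalize-before-fusion idea concrete, here is a minimal generic sketch: each modality's features are standardized independently before concatenation. This is plain per-feature standardization, not RegBN's regularization-based method; it only equalizes scales and does not remove the cross-modal dependencies the abstract refers to. Function names are ours.

```python
import numpy as np

def normalize_then_fuse(modalities, eps=1e-6):
    """Standardize each modality's features independently (zero mean,
    unit variance per feature across the batch), then fuse by
    concatenation along the feature axis."""
    normed = []
    for x in modalities:  # x: array of shape (batch, features)
        mu = x.mean(axis=0, keepdims=True)
        sigma = x.std(axis=0, keepdims=True)
        normed.append((x - mu) / (sigma + eps))
    return np.concatenate(normed, axis=1)
```

Without such a step, a modality with large-magnitude features (e.g. raw audio energies vs. normalized text embeddings) can dominate the fused representation.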


The Entropic Signature of Class Speciation in Diffusion Models

Handke, Florian, Stančević, Dejan, Koulischer, Felix, Demeester, Thomas, Ambrogioni, Luca

arXiv.org Machine Learning

Diffusion models do not recover semantic structure uniformly over time. Instead, samples transition from semantic ambiguity to class commitment within a narrow regime. Recent theoretical work attributes this transition to dynamical instabilities along class-separating directions, but practical methods to detect and exploit these windows in trained models are still limited. We show that tracking the class-conditional entropy of a latent semantic variable given the noisy state provides a reliable signature of these transition regimes. By restricting the entropy to semantic partitions, the entropy can furthermore resolve semantic decisions at different levels of abstraction. We analyze this behavior in high-dimensional Gaussian mixture models and show that the entropy rate concentrates on the same logarithmic time scale as the speciation symmetry-breaking instability previously identified in variance-preserving diffusion. We validate our method on EDM2-XS and Stable Diffusion 1.5, where class-conditional entropy consistently isolates the noise regimes critical for semantic structure formation. Finally, we use our framework to quantify how guidance redistributes semantic information over time. Together, these results connect information-theoretic and statistical physics perspectives on diffusion and provide a principled basis for time-localized control.
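The Gaussian-mixture analysis above can be illustrated with a toy Monte-Carlo estimate. The sketch below computes the class-conditional entropy H(c | x_t) for a symmetric two-class 1D Gaussian mixture under a variance-preserving forward process: at high noise the entropy is near log 2 (classes indistinguishable), at low noise it collapses toward 0 (class committed). This is our own simplified illustration of the quantity being tracked, not the paper's implementation; parameter choices are arbitrary.

```python
import numpy as np

def class_conditional_entropy(alpha, mu=3.0, sigma=0.5, n=20000, rng=None):
    """Monte-Carlo estimate of H(c | x_t) in nats for a symmetric
    two-class 1D Gaussian mixture with class means +-mu and equal
    priors, under x_t = sqrt(alpha)*x_0 + sqrt(1-alpha)*eps."""
    rng = rng or np.random.default_rng(0)
    c = rng.integers(0, 2, n)                      # class labels
    x0 = rng.normal(np.where(c == 0, -mu, mu), sigma)
    xt = np.sqrt(alpha) * x0 + np.sqrt(1 - alpha) * rng.normal(size=n)
    # class-conditional distribution of x_t: N(+-sqrt(alpha)*mu, var_t)
    var_t = alpha * sigma**2 + (1 - alpha)
    # Bayes posterior p(c=1 | x_t): log-likelihood ratio is 2*m*x_t/var_t
    log_ratio = 2.0 * np.sqrt(alpha) * mu * xt / var_t
    p1 = np.clip(1.0 / (1.0 + np.exp(-log_ratio)), 1e-12, 1 - 1e-12)
    h = -(p1 * np.log(p1) + (1 - p1) * np.log(1 - p1))
    return h.mean()
```

Sweeping `alpha` from 0 to 1 traces the entropy drop; in this toy model the drop concentrates in a narrow noise regime, mirroring the transition window the abstract describes.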